• word of the day

    tropical medicine


    Definition
    (noun) the branch of medicine that deals with the diagnosis and treatment of diseases that are found most often in tropical regions

Word used in video below:
    "Raj is from India, a tropical country."